The Week That Was Dec. 11, 2004

1. New on the Web: MICHAEL SHIBA DESCRIBES VARIOUS INITIATIVES BY STATE LEGISLATORS AND OFFICIALS (GOVERNORS, ATTORNEYS GENERAL, AND EVEN MAYORS) TO INSTITUTE UNILATERAL EMISSION CONTROLS ON CARBON DIOXIDE THAT MIMIC THE KYOTO PROTOCOL, WHICH THE FEDERAL GOVERNMENT HAS TURNED DOWN.
2. AUTOMAKERS TAKE CALIFORNIA'S CLIMATE EMISSIONS RULE TO COURT
3. U.S. PIRG URGES SUPPORTERS TO BEGIN CALL-IN CAMPAIGN AGAINST ARCTIC OIL EXPLORATION
4. HOMELAND SECURITY ISSUES MORE REASONABLE RADIATION GUIDELINES
5. GREENPEACE CO-FOUNDER SAYS ORGANIZATION HAS LOST ITS WAY
6. HYDROGEN PRODUCTION METHOD COULD BOLSTER FUEL SUPPLIES
7. MERCURY REDUCTION RULES ARE FISHY
8. FREAK WEATHER EVENTS NOT RELATED TO GLOBAL WARMING
9. MUCH ADO ABOUT FU: THE SATELLITE SAGA CONTINUES
FRESNO, California, December 8, 2004 (ENS) - A coalition of the nation's
largest carmakers filed suit in federal court in Fresno, California, Tuesday
challenging California's new standards for vehicle emissions of greenhouse
gases linked to global warming. The Alliance of Automobile Manufacturers
argues that Californians would pay "an average of $3,000 more"
for a new automobile and "would never recoup those extra, up-front
dollars through savings at the gas pump."

Automakers Sue to Block Emissions Law in California
The regulation - the first of its kind in North America - would require
automakers to cut by roughly 30 percent the greenhouse gas emissions from
cars and trucks sold in the state by the 2016 model year. The industry
is suing in federal court in Fresno, Calif., contending that California's
regulation is pre-empted by Washington's authority to regulate fuel economy.
Greenhouse gas emissions from cars and trucks are a function of fuel economy.
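Whether the claimed $3,000 could be recouped at the pump depends on assumptions the article does not give, such as fuel price, annual mileage, and how much fuel economy actually improves. A minimal payback sketch, with every input except the $3,000 figure being a purely hypothetical assumption for illustration:

```python
# Hypothetical payback calculation for the disputed $3,000 price increase.
# Only the $3,000 figure comes from the article; all other inputs are assumptions.
EXTRA_COST_USD = 3000.0        # industry's claimed average price increase
MILES_PER_YEAR = 12_000        # assumed annual driving
BASELINE_MPG = 25.0            # assumed current fuel economy
FUEL_SAVINGS_FRACTION = 0.30   # assume fuel use falls in step with the ~30% GHG cut
GAS_PRICE_USD_PER_GAL = 2.00   # assumed 2004-era gasoline price

gallons_saved = (MILES_PER_YEAR / BASELINE_MPG) * FUEL_SAVINGS_FRACTION
dollars_saved_per_year = gallons_saved * GAS_PRICE_USD_PER_GAL
payback_years = EXTRA_COST_USD / dollars_saved_per_year

print(round(dollars_saved_per_year), round(payback_years, 1))  # ~$288/yr, ~10.4 years
```

With these particular assumptions the payback period approaches the typical span of vehicle ownership, which is exactly where the two sides of the lawsuit disagree.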
An overview of the new draft "protective action guidelines" recommended by the Department of Homeland Security:

First-Responder Exposure: Over the course of the initial event, the new guidelines say it is safe for firemen, police and EMTs to receive a total exposure of five rem. That is the equivalent of 5,000 dental X-rays, or roughly 20 times the radiation people are normally exposed to in a year from natural background sources.

Committee To Bridge The Gap: Doses Equivalent to Tens of Thousands of Chest X-rays Could be Allowed
WASHINGTON, DC - Dec 2, 2004. More than 50 public policy organizations today called on the Department of Homeland Security (DHS) to halt plans to dramatically weaken requirements for cleaning up radioactive contamination from a terrorist radiological or nuclear explosive. The groups disclosed that DHS is about to release new guidance that could permit ongoing contamination at levels equivalent to a person receiving tens of thousands of chest X-rays over thirty years. Official government risk figures estimate that as many as a quarter of the people exposed to such doses would develop cancer.

In a letter to outgoing DHS Secretary Tom Ridge, the groups said, "An attack by a terrorist group using a 'dirty bomb' or improvised nuclear device would be a terrible tragedy. . . . But should such a radiological weapon go off in the U.S., our government should not compound the situation by employment of standards for cleaning up the radioactive contamination that are inadequately protective of the public."

"Far from protecting us from the potentially catastrophic health effects of a terrorist dirty bomb, by permitting such high radiation levels to remain without cleanup, Homeland Security would actually be increasing the casualty count," said Diane D'Arrigo, Radioactive Waste Project Director at Nuclear Information and Resource Service. "Approval of this guidance would also set a dangerous precedent to weaken the already inadequate cleanup standards for nuclear-contaminated sites across this country."

"Benchmark" cleanup standards contemplated in the DHS guidance are up to 2,500 times less protective than the risk levels considered by EPA as barely acceptable for cleanup of Superfund toxic and radioactive sites. [Comment: Or are EPA standards too strict?]

"We recognize that response actions in the immediate aftermath of a terrorist incident may require extraordinary measures and doses," said Daniel Hirsch, President of the Committee to Bridge the Gap and initiator of the group letter. "However, it is unacceptable to set final cleanup goals so lax that long-term cancer risks are hundreds of times higher than currently accepted for remediation of the nation's most contaminated sites."

In a parallel letter to the Environmental Protection Agency, the groups urged Administrator Michael Leavitt to resist any effort to establish cleanup standards that permit public risks significantly outside EPA's longstanding legally allowable risk range. Signers include Committee to Bridge the Gap, Nuclear Information and Resource Service, Union of Concerned Scientists, Sierra Club, Physicians for Social Responsibility, Public Citizen, and Greenpeace.

SEPP Comment: By now our readers should be aware of the unscientific use of the LNT (linear no-threshold) hypothesis. But just to remind: LNT assumes that cancer risk is strictly proportional to radiation dose all the way down to zero, with no threshold below which the risk vanishes, so even very small doses spread over a large population are credited with a calculated number of casualties.
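To put the quoted equivalences in perspective, a minimal back-of-the-envelope check; the ~1 mrem per dental X-ray is simply what the article's own "5 rem = 5,000 dental X-rays" figure implies, and the ~300 mrem per year of natural background is an assumed typical U.S. value, not a number from the DHS guidance:

```python
# Back-of-the-envelope check of the first-responder dose equivalences quoted above.
DENTAL_XRAY_MREM = 1.0                   # implied by "5 rem = 5,000 dental X-rays"
NATURAL_BACKGROUND_MREM_PER_YR = 300.0   # assumed typical U.S. natural background

first_responder_dose_mrem = 5 * 1000     # 5 rem expressed in millirem

print(first_responder_dose_mrem / DENTAL_XRAY_MREM)                # 5000.0 dental X-rays
print(first_responder_dose_mrem / NATURAL_BACKGROUND_MREM_PER_YR)  # ~16.7, i.e. roughly the 20-fold figure quoted
```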
Patrick Moore, co-founder of Greenpeace, says that he left the mainstream green movement in 1986 because it abandoned science and logic in favor of an anti-corporate, anti-globalization agenda. Moore, who holds a Ph.D. in ecology, says that instead of using science to solve problems such as whaling, nuclear testing and toxic waste, Greenpeace became more concerned with maintaining problems to further a leftist political agenda. For example, Greenpeace:

O Effectively demanded that nuclear waste never be buried; but this meant more individuals would be exposed to risk as waste is shuffled from one location to another.
O Opposed aquaculture and insisted society catch only a sustainable level of fish from the wild; but this would drive up the price of fish to the point where only the wealthy could afford it.
O Insisted that all farming be organic; but this would leave millions around the world unfed without the availability of cost-saving agricultural technologies.

Moore suggests Greenpeace's final remnants of a science-based agenda were destroyed after the fall of the Berlin Wall in 1989 due to the influx of peace activists and Marxist ideologues into the green movement.

Source: Roger Bate, "Moore Wisdom Needed," Economic Affairs, Vol. 24, Institute of Economic Affairs, June 2004.
WASHINGTON, Nov. 27 - Researchers at a government nuclear laboratory and a ceramics company in Salt Lake City say they have found a way to produce pure hydrogen with far less energy than other methods, raising the possibility of using nuclear power to indirectly wean the transportation system from its dependence on oil.

The development would move the country closer to the Energy Department's goal of a "hydrogen economy," in which hydrogen would be created through a variety of means and consumed by devices called fuel cells, to make electricity to run cars and for other purposes.

Experts cite three big roadblocks to a hydrogen economy: manufacturing hydrogen cleanly and at low cost, finding a way to ship it and store it on the vehicles that use it, and reducing the astronomical price of fuel cells. "This is a breakthrough in the first part," said J. Stephen Herring, a consulting engineer at the Idaho National Engineering and Environmental Laboratory, which plans to announce the development on Monday with Cerametec Inc. of Salt Lake City. The developers also said the hydrogen could be used by oil companies to stretch oil supplies even without solving the fuel cell and transportation problems. Mr. Herring said the experimental work showed the "highest-known production rate of hydrogen by high-temperature electrolysis." But the plan requires the building of a new kind of nuclear reactor, at a time when the United States is not even building conventional reactors, and the cost estimates are uncertain.

The heart of the plan is an improvement on the most convenient way to make hydrogen, which is to run electric current through water, splitting the H2O molecule into hydrogen and oxygen. This process, called electrolysis, now has a drawback: if the electricity comes from coal, which is the biggest source of power in this country, then the energy value of the ingredients - the amount of energy given off when the fuel is burned - is three and a half to four times larger than the energy value of the product. Also, carbon dioxide and nitrogen oxide emissions increase when the additional coal is burned. Hydrogen can also be made by mixing steam with natural gas and breaking apart both molecules, but the price of natural gas is rising rapidly.

The new method involves running electricity through water that has a very high temperature. As the water molecule breaks up, a ceramic sieve separates the oxygen from the hydrogen. The resulting hydrogen has about half the energy value of the energy put into the process, the developers say. Such losses may be acceptable, or even desirable, because hydrogen from a nuclear reactor can be substituted for oil, which is imported and expensive, and because the basic fuel, uranium, is plentiful.

The idea is to build a reactor that would heat the cooling medium in the nuclear core, in this case helium gas, to about 1,000 degrees Celsius, or more than 1,800 degrees Fahrenheit. The existing generation of reactors, used exclusively for electric generation, uses water for cooling and heats it to only about 300 degrees Celsius. The hot gas would be used two ways. It would spin a turbine to make electricity, which could be run through the water being separated. And it would heat that water, to 800 degrees Celsius. But if electricity demand on the power grid ran extremely high, the hydrogen production could easily be shut down for a few hours and all of the energy converted to electricity, designers say.

The goal is to create a reactor that could produce about 300 megawatts of electricity for the grid, enough to run about 300,000 window air-conditioners, or produce about 2.5 kilos of hydrogen per second. When burned, a kilo of hydrogen has about the same energy value as a gallon of unleaded regular gasoline. But fuel cells, which work without burning, get about twice as much work out of each unit of fuel. So if used in automotive fuel cells, the reactor might replace more than 400,000 gallons of gasoline per day.

The part of the plan that the laboratory and the ceramics company have tested is high-temperature electrolysis. There is only limited experience building high-temperature gas-cooled reactors, though, and no one in this country has ordered any kind of big reactor, even of more conventional design, in 30 years, except for those whose construction was canceled before completion. Another problem is that the United States has no infrastructure for shipping large volumes of hydrogen. Currently, most hydrogen is produced at the point where it is used, mostly in oil refineries. Hydrogen is used to draw the sulfur out of crude oil, to break up hydrocarbon molecules that are too big for use in liquid fuel, and to change the carbon-hydrogen ratio to one more favorable for vehicle fuel.

Mr. Herring suggested another use, however: recovering usable fuel from the Athabasca Tar Sands in Alberta, Canada. The reserves there may hold the largest oil deposits in the world, but extracting them and converting them into a gasoline substitute requires copious amounts of steam and hydrogen, both products of the reactor.
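The 400,000-gallon figure follows directly from the numbers quoted above (2.5 kg of hydrogen per second, one kilogram of hydrogen roughly equal in energy to a gallon of gasoline when burned, and a factor of two for fuel-cell efficiency); a minimal sketch of that arithmetic:

```python
# Rough check of the gasoline-replacement estimate quoted above.
H2_RATE_KG_PER_S = 2.5            # hydrogen output quoted for the proposed reactor
SECONDS_PER_DAY = 86_400
GALLONS_GASOLINE_PER_KG_H2 = 1.0  # ~1 kg H2 ~ 1 gallon gasoline when burned (as stated)
FUEL_CELL_ADVANTAGE = 2.0         # fuel cells get about twice the work per unit of fuel

h2_per_day_kg = H2_RATE_KG_PER_S * SECONDS_PER_DAY
gasoline_equiv_gal = h2_per_day_kg * GALLONS_GASOLINE_PER_KG_H2 * FUEL_CELL_ADVANTAGE

print(round(h2_per_day_kg))        # ~216,000 kg of hydrogen per day
print(round(gasoline_equiv_gal))   # ~432,000 gallons of gasoline displaced per day
```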
The Environmental Protection Agency's proposed new rules for mercury reductions would cost about $1.4 billion per year and would have a negligible impact on public health, says environmental consultant Joel Schwartz. Two proposals are being considered, both of which would target coal-fired utility boilers for mercury reductions. However, both are costly and the monetary benefits of such reductions are unknown, which even the EPA admits, says Schwartz:

O The first proposal, based mainly on a cap-and-trade system, would cost the industry about $1.36 billion per year, with an estimated cost to society of about $1.6 billion.
O The second proposal, a two-phase reduction, would cost $2.9 billion.

However, the benefits are questionable, particularly in light of previous studies examining the effects of mercury exposure on mothers and children in the Faroe Islands and the Seychelles, says Schwartz:

O Based on the Faroes study, a total elimination of mercury emissions in the United States would improve children's health minimally: children in the 10th percentile on neurological and cognitive test scores would move to between the 10.3 and 10.6 percentiles, at best.
O A study of children in the Seychelles indicated no harm from mercury exposure, even though their exposure was greater than that of the most highly exposed Americans.
O Furthermore, the new rules assume a one-to-one correspondence between mercury emissions and mercury levels in freshwater fish; more likely, fish mercury levels would decline by less than half of the reduction in mercury deposition.

Source: Joel Schwartz, "A Regulatory Analysis of EPA's Proposed Rule to Reduce Mercury Emissions from Utility Boilers," AEI-Brookings Joint Center for Regulatory Studies, September 2004.
Is "freak weather", in particular heat waves in Europe 2003,
becoming more common due to global warming -- with humans held accountable.
How do these computer scientists explain the deadliest heat wave that
occurred in central and southern Canada in July 1936 (5-12 July) when
there were very few people living in Canada? This Canadian heat wave killed
more than 1100 people in a span of just 10 days, most of the deaths occurring
from heat exhaustion and dehydration due to unavailability of air-conditioned
houses. The large number of fatalities in Europe's 2003 heat wave could
have been avoided if most of those senior citizens were moved to open
areas and/or air conditioned houses, in time. The number of deaths does
not make a heat wave any worse than what the temperature structure suggests.
In July 1995, a severe heat wave in Chicago, Illinois, killed about 800 people, most of them senior citizens living in houses without air-conditioning who were too afraid to open windows for fear of vandalism. A report published recently blamed a lack of suitable precautions by city officials for the large number of deaths.
The original satellite dataset produced by the University of Alabama in Huntsville (UAH) now shows a warming trend of 0.08 deg C/decade since 1979, while the surface thermometer trend is two to three times this value. Climate models, in contrast, predict that any surface warming caused by greenhouse gases should be amplified with height, not reduced. This has led to varying levels of concern in the climate community that the theory contained in the climate models might be in error.

As background, a study published earlier this year by Fu et al. (1) attempted to estimate the amount of tropospheric warming by a simple linear combination of the stratospheric and tropospheric channels of the Microwave Sounding Units (MSUs) flying on NOAA polar-orbiting weather satellites. (The troposphere extends from the surface up to a height of around 8-12 miles; the stratosphere overlies it.) Since the tropospheric channel has about a 15% influence from the stratosphere -- which has cooled strongly since 1979 -- the tropospheric temperature can only be estimated by removing the stratospheric component. Fu et al. used radiosonde (weather balloon) data to arrive at an optimum combination of the two channels that, when applied to the satellite-observed temperature trends, resulted in a tropospheric warming trend larger than that estimated by UAH with a different technique.

In the first article announced this week, Fu & Johanson (2) estimate the stratospheric contribution to the satellite instrument's tropospheric channel through a slightly different method than in their original article. They used previously published radiosonde estimates of temperature trends through the lower and middle stratosphere to estimate the error in their method, as well as the amount of stratospheric cooling contained in the tropospheric channel.

While we would prefer to leave detailed comments for a journal article, a couple of general points can be made. For the period they examined (1979-2001), our (UAH) lower-tropospheric temperature trend is +0.06 deg C/decade, while their estimate of the (whole) tropospheric trend is +0.09 deg C/decade. The difference between these two trends is small, considering the probable error bounds on the estimates and the fact that the two techniques measure somewhat different layers. Also, their method depends on trusting the radiosonde-measured trends in the lower stratosphere, even though we know there are larger errors at those altitudes than in the troposphere -- and most published radiosonde trends for the troposphere show little or no global warming (!). As is often the case, the press release that described their new study made claims that were, in my view, exaggerated. Nevertheless, given the importance of the global warming issue, this line of research is probably worthwhile, as it provides an alternative way of interpreting the satellite data.

The other study (3), published by Simon Tett and Peter Thorne at the UK's Hadley Centre, takes issue with the original Fu et al. method. Tett and Thorne claim that when the technique is applied to a variety of radiosonde, reanalysis, and global model simulation datasets in the tropics, it leads to results that are more variable than the UAH technique produces. They also point out the dependence of the method on the characteristics of the radiosonde data that are assumed. What all this means in terms of observed and predicted global temperature trends remains to be seen.

As part of the requirements of the Bush administration's Climate Change Science Plan, a variety of scientists are now sifting through the satellite, surface thermometer, and radiosonde data, and will report on their findings in the coming year.

References
1. Fu, Q., C.M. Johanson, S.G. Warren, and D.J. Seidel, 2004: Contribution of stratospheric cooling to satellite-inferred tropospheric temperature trends. Nature, Vol. 429, pp. 55-58.
2. Fu, Q., and C.M. Johanson, 2004: Stratospheric influences on MSU-derived tropospheric temperature trends: A direct error analysis. Journal of Climate, to be published December 15, 2004.
3. Tett, S., and P. Thorne, 2004: Tropospheric temperature series from satellites. Nature online, December 2, 2004 (subscription required).
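To make the channel-combination idea concrete, here is a toy sketch of the simplest possible form of such a correction, using only the ~15% stratospheric influence stated above. Fu et al. actually derived their channel weights by regression against radiosonde data, so this is not their method, and the trend values in the example are placeholders rather than published numbers:

```python
# Toy illustration of removing the stratospheric contribution from the MSU
# tropospheric channel (channel 2), assuming a fixed 15% stratospheric weight
# and using the stratospheric channel (channel 4) as a proxy for that layer.
STRAT_WEIGHT = 0.15  # ~15% of the channel-2 signal comes from the stratosphere (as stated above)

def corrected_tropospheric_trend(t2_trend, t4_trend, w=STRAT_WEIGHT):
    """Solve t2 = (1 - w)*troposphere + w*stratosphere for the tropospheric trend."""
    return (t2_trend - w * t4_trend) / (1.0 - w)

# Placeholder trends in deg C/decade: a modest channel-2 warming combined with
# strong stratospheric cooling yields a larger inferred tropospheric warming.
print(corrected_tropospheric_trend(t2_trend=0.04, t4_trend=-0.40))  # ~0.12
```

Because the stratosphere has cooled strongly, any correction of this general form pushes the inferred tropospheric trend upward, which is why the choice of weights and of the stratospheric data matters so much in this dispute.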
SEPP Comment: If one assumes that all of the 0.08 C/decade trend is anthropogenic, then the maximum temperature rise by 2100 is likely to be only about 0.8 C.
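The arithmetic behind that comment, a minimal sketch assuming the quoted trend simply continues unchanged from 2004 to 2100:

```python
# Linear extrapolation of the quoted satellite trend to 2100 (assumes the trend
# is constant and entirely anthropogenic, as the comment stipulates).
trend_c_per_decade = 0.08
decades_remaining = (2100 - 2004) / 10.0       # ~9.6 decades
print(trend_c_per_decade * decades_remaining)  # ~0.77 C, i.e. roughly 0.8 C
```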
Please be as generous as you can. Mail your check to SEPP. If you would like to receive our books or pamphlets, please so indicate.

The Science & Environmental Policy Project (SEPP) is an international association of mainly physical scientists and engineers concerned with the responsible use of scientific information in the development of environmental policies. We publish scientific reports, hold briefings, give talks and seminars, and issue a weekly e-mail bulletin to some 2000 addressees. Our web address is www.sepp.org

Our Priority Issues are: Climate change

1. We do not solicit support from either industry or government but receive donations and grants from private individuals and foundations.
2. Our tax status is as a 501(c)(3) organization. Donations are fully tax-deductible.
3. Our officers and board members do not receive salaries or fees. We have no salaried employees but use volunteers and student help.
4. We don't rent or sell readership or donor lists. We don't accept pop-up or other advertising.

SEASON'S GREETINGS TO ALL